    Infinitesimal Probabilities

    Non-Archimedean probability functions allow us to combine regularity with perfect additivity. We discuss the philosophical motivation for a particular choice of axioms for a non-Archimedean probability theory and answer some philosophical objections that have been raised against infinitesimal probabilities in general.

    The Snow White problem

    The Snow White problem is introduced to demonstrate how learning something of which one could not have learnt the opposite (due to observer selection bias) can change an agent’s probability assignment. This helps us to analyse the Sleeping Beauty problem, which is deconstructed as a combinatorial engine and a subjective wrapper. The combinatorial engine of the problem is analogous to Bertrand’s boxes paradox and can be solved with standard probability theory. The subjective wrapper is clarified using the Snow White problem. Sample spaces for all three problems are presented. The conclusion is that subjectivity plays no irreducible role in solving the Sleeping Beauty problem and that no reference to centered worlds is required to provide the answer.
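The analogy with Bertrand's boxes mentioned above can be made concrete by enumerating the sample space; the following is an illustrative sketch using standard probability theory, not code from the paper.

```python
from fractions import Fraction

# Bertrand's boxes: three boxes containing two gold coins (GG),
# one gold and one silver (GS), and two silver coins (SS).
# An outcome is a (box, drawn-coin) pair; box and coin are chosen uniformly,
# so each of the six outcomes has probability 1/6.
boxes = {"GG": ["G", "G"], "GS": ["G", "S"], "SS": ["S", "S"]}
outcomes = [(name, i) for name, coins in boxes.items() for i in range(2)]

# Condition on having drawn a gold coin; ask for the probability that the
# other coin in the same box is also gold.
gold_draws = [(name, i) for name, i in outcomes if boxes[name][i] == "G"]
both_gold = [(name, i) for name, i in gold_draws if boxes[name][1 - i] == "G"]

posterior = Fraction(len(both_gold), len(gold_draws))
print(posterior)  # 2/3
```

The counter-intuitive 2/3 (rather than 1/2) arises because the GG box contributes two of the three equiprobable gold draws, which is the combinatorial point the abstract exploits.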

    Uniform probability in cosmology

    Problems with uniform probabilities on an infinite support show up in contemporary cosmology. This paper focuses on the context of inflation theory, where it complicates the assignment of a probability measure over pocket universes. The measure problem in cosmology, whereby it seems impossible to pick out a uniquely well-motivated measure, is associated with a paradox that occurs in standard probability theory and crucially involves uniformity on an infinite sample space. This problem has been discussed by physicists, albeit without reference to earlier work on this topic. The aim of this article is both to introduce philosophers of probability to these recent discussions in cosmology and to familiarize physicists and philosophers working on cosmology with relevant foundational work by Kolmogorov, de Finetti, Jaynes, and other probabilists. As such, the main goal is not to solve the measure problem, but to clarify the exact origin of some of the current obstacles. The analysis of the assumptions going into the paradox indicates that there exist multiple ways of dealing consistently with uniform probabilities on infinite sample spaces. Taking a pluralist stance towards the mathematical methods used in cosmology shows there is some room for progress with assigning probabilities in cosmological theories. Comment: 16 pages; accepted for publication in Studies in History and Philosophy of Science.
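The kind of paradox at issue can be illustrated with a standard textbook argument (my illustration, not a quotation from the article): no countably additive uniform probability measure exists on a countably infinite sample space such as the natural numbers.

```latex
% Uniformity forces every singleton to receive the same value c, so
% countable additivity yields a contradiction:
1 \;=\; P(\mathbb{N}) \;=\; \sum_{n=1}^{\infty} P(\{n\})
  \;=\; \sum_{n=1}^{\infty} c \;=\;
\begin{cases}
  0      & \text{if } c = 0,\\
  \infty & \text{if } c > 0.
\end{cases}
```

The "multiple ways of dealing consistently with uniform probabilities" mentioned in the abstract correspond to relaxing one of the assumptions in this argument, such as countable additivity.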

    Models and Simulations in Material Science: Two Cases Without Error Bars

    We discuss two research projects in material science in which the results cannot be stated with an estimation of the error: a spectroscopic ellipsometry study aimed at determining the orientation of DNA molecules on diamond and a scanning tunneling microscopy study of platinum-induced nanowires on germanium. To investigate the reliability of the results, we apply ideas from the philosophy of models in science. Even if the studies had reported an error value, the trustworthiness of the result would not depend on that value alone. Comment: 20 pages, 2 figures

    Degrees of riskiness, falsifiability, and truthlikeness. A neo-Popperian account applicable to probabilistic theories

    In this paper, we take a fresh look at three Popperian concepts: riskiness, falsifiability, and truthlikeness (or verisimilitude) of scientific hypotheses or theories. First, we make explicit the dimensions that underlie the notion of riskiness. Secondly, we examine if and how degrees of falsifiability can be defined, and how they are related to various dimensions of the concept of riskiness as well as the experimental context. Thirdly, we consider the relation of riskiness to (expected degrees of) truthlikeness. Throughout, we pay special attention to probabilistic theories and we offer a tentative, quantitative account of verisimilitude for probabilistic theories. Comment: 41 pages; 3 figures; accepted for publication in Synthese.

    Degrees of freedom

    Human freedom is in tension with nomological determinism and with statistical determinism. The goal of this paper is to answer both challenges. Four contributions are made to the free-will debate. First, we propose a classification of scientific theories based on how much freedom they allow. We take into account that indeterminism comes in different degrees and that both the laws and the auxiliary conditions can place constraints. A scientific worldview pulls towards one end of this classification, while libertarianism pulls towards the other end of the spectrum. Second, inspired by Hoefer, we argue that an interval of auxiliary conditions corresponds to a region in phase space, and to a bundle of possible block universes. We thus make room for a form of non-nomological indeterminism. Third, we combine crucial elements from the works of Hoefer and List; we attempt to give a libertarian reading of this combination. On our proposal, throughout spacetime, there is a certain amount of freedom (equivalent to setting the initial, intermediate, or final conditions) that can be interpreted as the result of agential choices. Fourth, we focus throughout on the principle of alternative possibilities and propose three ways of strengthening it.

    Fair infinite lotteries

    This article discusses how the concept of a fair finite lottery can best be extended to denumerably infinite lotteries. Techniques and ideas from non-standard analysis are brought to bear on the problem.

    New Theory about Old Evidence: A framework for open-minded Bayesianism

    We present a conservative extension of a Bayesian account of confirmation that can deal with the problem of old evidence and new theories. So-called open-minded Bayesianism challenges the assumption, implicit in standard Bayesianism, that the correct empirical hypothesis is among the ones currently under consideration. It requires the inclusion of a catch-all hypothesis, which is characterized by means of sets of probability assignments. Upon the introduction of a new theory, the former catch-all is decomposed into a new empirical hypothesis and a new catch-all. As will be seen, this motivates a second update rule, besides Bayes' rule, for updating probabilities in light of a new theory. This rule conserves probability ratios among the old hypotheses. This framework allows for old evidence to confirm a new hypothesis due to a shift in the theoretical context. The result is a version of Bayesianism that, in the words of Earman, "keep[s] an open mind, but not so open that your brain falls out".
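The second update rule described above can be sketched as follows; the function name, the dictionary representation, and the particular `share` parameter are illustrative assumptions, not the paper's formalism. The sketch moves probability mass from the catch-all to the newly formulated hypothesis while leaving the old empirical hypotheses, and hence their mutual ratios, untouched.

```python
# Hedged sketch of a "theory update": decompose the catch-all hypothesis
# into a new empirical hypothesis and a smaller catch-all, conserving the
# probability ratios among the old hypotheses.
def introduce_theory(probs, catch_all_key, new_key, share):
    """Move a fraction `share` of the catch-all's probability to a newly
    formulated hypothesis `new_key`; all other entries are untouched."""
    new_probs = dict(probs)
    moved = probs[catch_all_key] * share
    new_probs[new_key] = moved
    new_probs[catch_all_key] = probs[catch_all_key] - moved
    return new_probs

prior = {"H1": 0.3, "H2": 0.5, "catch_all": 0.2}
posterior = introduce_theory(prior, "catch_all", "H3", share=0.5)
print(posterior)  # {'H1': 0.3, 'H2': 0.5, 'catch_all': 0.1, 'H3': 0.1}
```

Because H1 and H2 keep their prior values, the ratio H1 : H2 is conserved exactly, which is the stated property of the rule; how the catch-all's mass is split is the part this sketch leaves as a free parameter.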

    Assigning probabilities to non-Lipschitz mechanical systems

    We present a method for assigning probabilities to the solutions of initial value problems that have a Lipschitz singularity. To illustrate the method, we focus on the following toy example: $\ddot{r} = r^\alpha$, $r(t=0) = 0$, and $\dot{r}(t=0) = 0$, where the dots indicate derivatives with respect to time and $\alpha \in \,]0,1[$. This example has a physical interpretation as a mass in a uniform gravitational field on a dome of particular shape; the case with $\alpha = 1/2$ is known as Norton's dome. Our approach is based on (1) finite difference equations, which are deterministic, (2) a uniform prior on the phase space, and (3) non-standard analysis, which involves infinitesimals and which is conceptually close to numerical methods from physical praxis. This allows us to assign probabilities to the solutions of the initial value problem in the original, indeterministic model. Comment: 13 figures
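The finite-difference ingredient of the approach can be illustrated with a minimal sketch; the explicit-Euler stepping, step sizes, and the choice of the α = 1/2 case are my assumptions, and the paper's actual procedure additionally involves a uniform prior and non-standard analysis.

```python
# Minimal explicit-Euler finite-difference sketch of Norton's dome:
# ddot(r) = sqrt(r), with r(0) = 0 and dr/dt(0) = 0 (the alpha = 1/2 case).
def simulate(r0, steps=100_000, dt=1e-3):
    r, v = r0, 0.0
    for _ in range(steps):
        v += dt * r ** 0.5  # acceleration a = r^(1/2)
        r += dt * v         # update position with the new velocity
    return r

# With r0 exactly zero the finite-difference trajectory stays at the apex;
# with any perturbation, however tiny, the mass eventually rolls off.
print(simulate(0.0))  # 0.0
print(simulate(1e-12))
```

This makes vivid why the finite-difference model is deterministic while the continuum problem is not: each perturbed initial condition picks out one trajectory, and replacing the tiny float by an infinitesimal is where the non-standard analysis of the paper comes in.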